Denoising in Representation Space via Data-Dependent Regularization for Better Representation

Authors

Abstract

Despite the success of deep learning models, it remains challenging for over-parameterized models to learn good representations under small-sample-size settings. In this paper, motivated by previous work on out-of-distribution (OoD) generalization, we study the problem from an OoD perspective to identify the fundamental factors affecting representation quality. We formulate the notion of "out-of-feature subspace (OoFS) noise" for the first time, and we link the OoFS noise in the feature extractor to the model's performance by proving two theorems demonstrating that reducing this noise is beneficial for achieving better representations. Moreover, we identify the causes of OoFS noise and prove that the noise induced by random initialization can be filtered out via L2 regularization. Finally, we propose a novel data-dependent regularizer that acts on the weights of the fully connected layer to reduce noise in the representations, thus implicitly forcing the feature extractor to focus on informative features and to rely less on noise via back-propagation. Experiments on synthetic datasets show that our method can learn hard-to-learn features, can filter out noise effectively, and outperforms GD, AdaGrad, and KFAC. Furthermore, experiments on benchmark datasets show that our method achieves the best performance on three tasks out of four.
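The abstract states the idea of the regularizer but not its formula, so the sketch below is only a rough illustration under one assumed reading: "noise in the representations" is approximated by the component of the fully connected layer's weights that falls outside the subspace spanned by a batch of representations, and that component is penalized. The names out_of_span_penalty, features, and fc_weight are illustrative and not taken from the paper.

```python
import torch

def out_of_span_penalty(features: torch.Tensor, fc_weight: torch.Tensor) -> torch.Tensor:
    """Illustrative data-dependent penalty on the final fully connected layer.

    features:  (batch, d) representations produced by the feature extractor.
    fc_weight: (num_classes, d) weights of the final fully connected layer.
    Returns the squared norm of the part of each class weight vector that lies
    outside the subspace spanned by the batch representations.
    """
    # Orthonormal basis of the subspace spanned by this batch of representations;
    # the basis is treated as a constant here (a simplification of this sketch).
    q, _ = torch.linalg.qr(features.detach().T)   # q: (d, r)
    w_parallel = fc_weight @ q @ q.T              # projection onto span(features)
    w_orth = fc_weight - w_parallel               # "out-of-feature-subspace" part
    return (w_orth ** 2).sum()

# Hypothetical usage inside a training step (model.backbone / model.fc assumed):
#   feats = model.backbone(x)
#   loss = criterion(model.fc(feats), y) + lam * out_of_span_penalty(feats, model.fc.weight)
```

Because the penalty depends on the batch of representations, its weighting changes with the data; the exact form used in the paper is available only in the full text.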


Similar articles

Feature Incay for Representation Regularization

Softmax loss is widely used in deep neural networks for multi-class classification, where each class is represented by a weight vector, a sample is represented as a feature vector, and the feature vector has the largest projection on the weight vector of the correct category when the model correctly classifies a sample. To ensure generalization, weight decay that shrinks the weight norm is ofte...
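As context for that setup, here is a minimal sketch of the softmax loss it describes, with class scores obtained as projections of a feature vector onto per-class weight vectors and weight decay shrinking the weight norms; the toy dimensions and names are illustrative.

```python
import torch
import torch.nn.functional as F

# Toy dimensions; names are illustrative.
num_classes, feat_dim = 5, 16
W = torch.randn(num_classes, feat_dim, requires_grad=True)  # one weight vector per class
feature = torch.randn(feat_dim)                             # feature vector of one sample
label = torch.tensor([2])

logits = (W @ feature).unsqueeze(0)    # projection of the feature onto each class weight
loss = F.cross_entropy(logits, label)  # softmax loss: the correct class should have
                                       # the largest projection
loss = loss + 5e-4 * (W ** 2).sum()    # weight decay: shrinks the weight norms
loss.backward()
```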


Cluster-space representation for hyperspectral data classification

This paper presents a generalization of the hybrid supervised–unsupervised approach to image classification, and an automatic procedure for implementing it with hyperspectral data. Cluster-space representation is introduced in which clustered training data is displayed in a one-dimensional (1-D) cluster-space showing its probability distribution. This representation leads to automatic associati...


A Dual Space Representation for Geometric Data

This paper presents a representation scheme for polyhedral objects in arbitrary dimensions. Each object is represented as the algebraic sum of convex polyhedra (cells). Each cell in turn is represented as the intersection of halfspaces and encoded in a vector. The notion of vertices is abandoned completely as it is not needed for the set and search operators we intend to support. We show how th...
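A minimal sketch of the kind of encoding that snippet outlines, under my own assumptions: each halfspace a·x ≤ b is stored as a coefficient vector (a, b), a cell is the intersection of such halfspaces, and an object is a signed ("algebraic") sum of cells. The classes and the membership test below are illustrative, not the paper's scheme.

```python
import numpy as np

class Cell:
    """A convex cell: intersection of halfspaces a . x <= b, each encoded as (a_1..a_n, b)."""
    def __init__(self, halfspaces):
        self.halfspaces = np.asarray(halfspaces, dtype=float)  # shape (m, n+1)

    def contains(self, point):
        a, b = self.halfspaces[:, :-1], self.halfspaces[:, -1]
        return bool(np.all(a @ point <= b))

class PolyhedralObject:
    """An object as a signed sum of cells: sign +1 adds a cell, -1 subtracts it."""
    def __init__(self, signed_cells):
        self.signed_cells = signed_cells  # list of (sign, Cell)

    def contains(self, point):
        # Membership as the algebraic sum of cell indicators (> 0 means inside).
        return sum(s for s, c in self.signed_cells if c.contains(point)) > 0

# Example: the unit square minus its lower-left quarter.
square = Cell([[-1, 0, 0], [0, -1, 0], [1, 0, 1], [0, 1, 1]])        # 0<=x<=1, 0<=y<=1
quarter = Cell([[-1, 0, 0], [0, -1, 0], [1, 0, 0.5], [0, 1, 0.5]])   # 0<=x<=0.5, 0<=y<=0.5
obj = PolyhedralObject([(+1, square), (-1, quarter)])
print(obj.contains(np.array([0.75, 0.75])))  # True: in the square, outside the quarter
print(obj.contains(np.array([0.25, 0.25])))  # False: the subtracted quarter cancels it
```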


Making Space for Time: Issues in Space-Time Data Representation

Even with much activity over the past decade, including organized efforts on both sides of the Atlantic, the representation of both space and time in digital databases is still problematic, and functional space-time systems have not gone beyond the limited prototype stage. Why is this the case? Why did it take twenty years from the first GIS for the need for representation and analysis in the temporal...


Learning Via Compact Data Representation

We present an unsupervised learning methodology derived from compact data encoding and demonstrate how to construct models of polysemy, priming, semantic disambiguation and learning using this theoretical basis. The model is capable of simulating human-like performance on artificial grammar learning.



Journal

Journal title: Mathematics

Year: 2023

ISSN: 2227-7390

DOI: https://doi.org/10.3390/math11102327